Web Survey Bibliography
Relevance & Research Question: Open-ended questions are often used to gather short numeric information in self-administered web questionnaires. Respondents are encouraged to enter numbers, quantities, or frequencies into input fields, usually without any computerized formatting constraints, primarily in order to prevent item nonresponse. However, the absence of formatting restrictions invites a wide variety of answers that deviate from the desired format, including value ranges, estimates, alphanumeric supplements, or even different measuring units, which degrades data quality and increases the effort required for data cleaning and preparation. Concise and clear formatting instructions are therefore needed to guide respondents toward providing answers in the desired format. Given that instructions are likely to be ignored, the question arises how different modes of verbal instructions and visual cues can be applied to improve the impact of formatting instructions and, ultimately, to enhance data quality.
Methods & Data: In a between-subjects field experiment conducted among university freshman students in an opt-in panel (N = 670), we tested different visual modes of formatting instructions for open-ended numeric questions: (1) a conventional instruction presented statically, (2) a dynamic instruction in a tooltip that appears when the mouse cursor hovers over the input field, and (3) a symbolic instruction in the form of a predefined default value in the input field indicating the desired response format. The effectiveness of each instruction mode was measured as the proportion of formally correct answers.
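The outcome measure above, the proportion of formally correct answers, can be illustrated with a small sketch. This is not the authors' actual coding procedure; it assumes, for illustration, that the desired response format is a plain integer and that anything else (ranges, estimates, alphanumeric supplements, units) counts as a format deviation.

```python
import re

# Hypothetical target format: a plain integer with nothing else in the field.
INTEGER_ONLY = re.compile(r"^\s*\d+\s*$")

def is_formally_correct(answer: str) -> bool:
    """True if the raw answer matches the desired numeric format."""
    return bool(INTEGER_ONLY.match(answer))

def correct_proportion(answers: list[str]) -> float:
    """Share of formally correct answers among all given answers."""
    if not answers:
        return 0.0
    return sum(is_formally_correct(a) for a in answers) / len(answers)

# Deviation types named in the abstract: a value range, an estimate,
# and an answer carrying a measuring unit, alongside two correct answers.
answers = ["12", "10-15", "about 20", "3 hours", "7"]
print(correct_proportion(answers))
```

In a real study the regular expression would be adapted to each question's desired format (e.g. decimals or a fixed number of digits); the comparison across instruction modes then reduces to comparing these proportions between experimental groups.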
Results: Findings indicated that implementing dynamic formatting elements, whether tooltips or default values, did not improve response quality compared with conventional static formatting instructions. Even a combination of tooltips and pre-filled symbols did not yield a significant increase in correctly formatted answers over the sole presentation of a static instruction.
Added Value: The results indicate that static formatting instructions should not be replaced hastily before the effects of dynamic elements have been examined sufficiently. However, initial findings suggest that dynamic formatting instructions have the potential to enhance the positive effect of conventional instructions.
GOR Homepage (abstract) / (presentation)
Web survey bibliography - Germany (361)
- Metadata on the demographics of online research: Results from a full-range study of available online...; 2013; Burger, C., Stieger, S.
- How the screen-out influence the dropout of a commercial panel; 2013; Bartoli, B.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Innovation in Data Collection: the Responsive Design Approach; 2013; Bianchi, A., Biffignandi, S.
- Break-off and attrition in the GIP amongst technologically experienced and inexperienced participants...; 2013; Blom, A. G., Bossert, D., Clark, V., Funke, F., Gebhard, F., Holthausen, A., Krieger, U., Wachenfeld...
- Nonresponse and Nonresponse Bias in a Probability-Based Internet Panel; 2013; Blom, A. G., Bossert, D., Funke, F., Gebhard, F., Holthausen, A., Krieger, U.
- Rewards - Money for Nothing?; 2013; Cape, P. J., Martin, P.
- Effects of incentive reduction after a series of higher incentive waves in a probability-based online...; 2013; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Timing of Nonparticipation in an Online Panel: The effect of incentive strategies; 2013; Douhou, S., Scherpenzeel, A.
- How Do Lotteries and Study Results Influence Response Behavior in Online Panels?; 2013; Goeritz, A., Luthe, S. C.
- Sample composition discrepancies in different stages of a probability-based online panel; 2013; Bosnjak, M., Haas, I., Galesic, M., Kaczmirek, L., Bandilla, W., Couper, M. P.
- Web-based data collection yielded an additional response bias—but had no direct effect on outcome...; 2012; Mayr, A., Gefeller, O., Prokosch, H.-U., Pirkl, A., Froehlich, A., de Zwaan, M.
- Passive measurement of online data in Practice - A White Paper Wakoopa; 2012
- Metering mobile usage. Insights from global Arbitron mobile trends panel; 2012; Verkasalo, H.
- Is “chapterisation” a viable alternative to traditional progress indicators?; 2012; Spicer, R., Dowling, Z.
- Online Questionnaires: Development of ‘basic requirements’; 2012; Tries, S., Blanke, K.
- Pros and cons of Internet based User Satisfaction Surveys; 2012; Consoli, A., Matsulevits, L.
- Between demand and reality: Ensuring efficiency and quality in pretesting questionnaires; 2012; Sattelberger, S., Blanke, K.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S., Nebel, S., Blanke, K.
- WebSM Study: Survey software features overview; 2012; Vehovar, V., Cehovin, G., Kavcic, L., Lenar, J.
- Challenges of assessing the quality of a prerecruited probability-based panel of internet users in...; 2012; Struminskaya, B., Kaczmirek, L.
- Assessing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys; 2012; Behr, D., Braun, M., Kaczmirek, L.
- Comparing Ranking Techniques in Web Surveys; 2012; Blasius, J.
- Design of CAWI Instruments for Social Surveys; 2012; Blanke, K.
- Enhancing Web Surveys With New HTML5 Input Types; 2012; Funke, F.
- The German Internet Panel: First Results from the Recruitment Phases; 2012; Blom, A. G.
- Assessing the Magnitude of Non-Consent Biases in Linked Survey and Administrative Data; 2012; Sakshaug, J. W., Kreuter, F.
- Market research with the iPad panel of Axel Springer Media Impact; 2012
- Effects of Personalized Versus Generic Implementation of an Intra-Organizational Online Survey on Psychological...; 2012; Mueller, K., Straatmann, T., Hattrup, K., Jochum, M.
- Exploring New Pathways to Survey Recruitment; 2012; Bilgram, V., Stadler, D., Jawecki, G.
- Does Mode Matter? Initial Evidence from the German Longitudinal Election Study (GLES); 2012; Blumenstiel, J. E., Rossmann, J.
- Surveytainment 2.0: Why investing 10 minutes more in constructing your questionnaire is worth considering...; 2012; Muehle, A., Tress, F., Schmidt, S., Winkler, T.
- Market research online community (MROC) versus focus group; 2012; Zuber, M.
- Data quality in MAWI and CAWI; 2012; Mavletova, A. M., Blasius, J.
- Scrutinizing Dynamics – Rolling panel waves in theory and practice; 2012; Faas, T., Blumenberg, J. N.
- Little experience with technology as a cause of nonresponse in online surveys; 2012; Struminskaya, B., Schaurer, I., Kaczmirek, L., Bandilla, W.
- Continuous large-scale volunteer web-surveys: The experience of Lohnspiegel and WageIndicator; 2012; Oez, F.
- Is Pretesting Established Among Online Survey Tool Users?; 2012
- An Evaluation of Two Non-Reactive Web Questionnaire Pretesting Methods; 2012; Lenzner, T.
- High potential for mobile Web surveys: Findings from a survey representative for German Internet users...; 2012; Funke, F., Wachenfeld, A.
- Can Social Media Research replace traditional research methods?; 2012; Faber, T., Einhorn, M., Hofmann, O., Loeffler, M.
- Bad Boy Matrix Question – Whatcha gonna do when they come for you?; 2012; Tress, F.
- Effects of Static versus Dynamic Formatting Instructions for Open-Ended Numerical Questions in Web Surveys...; 2012; Kunz, T., Fuchs, M.
- FamilyVote – Conducting online surveys with children and families; 2012; Geissler, H., Peeters, H.
- Assessing the Quality of Survey Data; 2012; Blasius, J.
- Exploring Animated Faces Scales in Web Surveys: Drawbacks and Prospects; 2012; Emde, M., Fuchs, M.
- Reminders in Web-Based Data Collection: Increasing Response at the Price of Retention?; 2012; Goeritz, A., Crutzen, R.
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S., Bauknecht, J.
- Mixing modes in the LFS - Computer-assisted, cost effective and respondent friendly; 2011; Koerner, T., van der Valk, J.
- Establishing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys...; 2011; Braun, M., Behr, D., Kaczmirek, L.